Boosting and Naive Bayesian Learning
Author
Abstract
Although so-called “naive” Bayesian classification makes the unrealistic assumption that the values of the attributes of an example are independent given the class of the example, this learning method is remarkably successful in practice, and no uniformly better learning method is known. Boosting is a general method of combining multiple classifiers due to Yoav Freund and Rob Schapire. This paper shows that boosting applied to naive Bayesian classifiers yields combination classifiers that are representationally equivalent to standard feedforward multilayer perceptrons. (An ancillary result is that naive Bayesian classification is a nonparametric, nonlinear generalization of logistic regression.) As a training algorithm, boosted naive Bayesian learning is quite different from backpropagation, and has definite advantages. Boosting requires only linear time and constant space, and hidden nodes are learned incrementally, starting with the most important. On the real-world datasets on which the method has been tried so far, generalization performance is as good as or better than the best published result using any other learning method. Unlike all other standard learning algorithms, naive Bayesian learning, with and without boosting, can be done in logarithmic time with a linear number of parallel computing units. Accordingly, these learning methods are highly plausible computationally as models of animal learning. Other arguments suggest that they are plausible behaviorally also.
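As a concrete illustration of the combination scheme described above, the following is a minimal sketch of AdaBoost with a naive Bayesian classifier as the weak learner. It is not the paper's exact construction: the binary -1/+1 label coding, the Gaussian variant of naive Bayes from scikit-learn, and the names boosted_naive_bayes and predict_ensemble are assumptions made here for illustration only.

import numpy as np
from sklearn.naive_bayes import GaussianNB  # stands in for whatever naive Bayesian weak learner is used

def boosted_naive_bayes(X, y, n_rounds=10):
    """AdaBoost with naive Bayes as the weak learner; y is a NumPy array of -1/+1 labels."""
    n = len(y)
    w = np.full(n, 1.0 / n)                      # example weights, initially uniform
    ensemble = []
    for _ in range(n_rounds):
        clf = GaussianNB().fit(X, y, sample_weight=w)
        pred = clf.predict(X)
        err = np.sum(w * (pred != y))            # weighted training error
        if err == 0.0 or err >= 0.5:             # perfect fit, or weak-learning condition violated
            ensemble.append((1.0, clf))
            break
        alpha = 0.5 * np.log((1.0 - err) / err)  # vote weight of this round's classifier
        w *= np.exp(-alpha * y * pred)           # increase the weights of misclassified examples
        w /= w.sum()
        ensemble.append((alpha, clf))
    return ensemble

def predict_ensemble(ensemble, X):
    """Combine the boosted naive Bayesian classifiers by a weighted vote."""
    score = sum(alpha * clf.predict(X) for alpha, clf in ensemble)
    return np.sign(score)

Each boosting round adds one more weighted base classifier to the combination, which is the sense in which the hidden nodes of the equivalent multilayer perceptron are learned incrementally, starting with the most important.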
Similar resources
Improving the Performance of Boosting for Naive Bayesian Classification
This paper investigates boosting naive Bayesian classification. It first shows that boosting cannot improve the accuracy of the naive Bayesian classifier on average in a set of natural domains. By analyzing the reasons for boosting's failures, we propose to introduce tree structures into naive Bayesian classification to improve the performance of boosting when working with naive Bayesian classification...
Lazy Bayesian Rules: A Lazy Semi-Naive Bayesian Learning Technique Competitive to Boosting Decision Trees
Lbr is a lazy semi-naive Bayesian classifier learning technique, designed to alleviate the attribute interdependence problem of naive Bayesian classification. To classify a test example, it creates a conjunctive rule that selects a most appropriate subset of training examples and induces a local naive Bayesian classifier using this subset. Lbr can significantly improve the performance of the naive ...
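The following is a much-simplified sketch of the lazy, local scheme just described, not the published Lbr algorithm (whose condition-selection criterion is more careful than the raw resubstitution error used here): for a given test example, greedily add attribute-value conditions taken from that example, keep only the matching training examples, and fit a local naive Bayesian classifier on them. The function names, the Gaussian naive Bayes stand-in, and the assumption of integer-coded discrete attributes are all choices made here for illustration.

import numpy as np
from sklearn.naive_bayes import GaussianNB  # stand-in for a discrete naive Bayesian learner

def _local_error(X, y):
    """Resubstitution error of a naive Bayesian classifier fit to the given subset."""
    clf = GaussianNB().fit(X, y)
    return np.mean(clf.predict(X) != y)

def lazy_local_nb(X_train, y_train, x_test, min_examples=30):
    """Classify one test example (a NumPy array of attribute values) with a local naive Bayesian classifier."""
    mask = np.ones(len(y_train), dtype=bool)             # start from the whole training set
    used = set()
    while True:
        best_attr = None
        best_err = _local_error(X_train[mask], y_train[mask])
        for a in range(X_train.shape[1]):
            if a in used:
                continue
            cand = mask & (X_train[:, a] == x_test[a])   # conjunctive condition taken from the test example
            if cand.sum() < min_examples:                # keep enough examples to train on
                continue
            err = _local_error(X_train[cand], y_train[cand])
            if err < best_err:
                best_attr, best_err = a, err
        if best_attr is None:                            # no condition improves the local error
            break
        used.add(best_attr)
        mask &= (X_train[:, best_attr] == x_test[best_attr])
    clf = GaussianNB().fit(X_train[mask], y_train[mask])
    return clf.predict(x_test.reshape(1, -1))[0]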
A Study of AdaBoost with Naive Bayesian Classifiers: Weakness and Improvement
This article investigates boosting naive Bayesian classification. It first shows that boosting does not improve the accuracy of the naive Bayesian classifier as much as we expected in a set of natural domains. By analyzing the reason for boosting’s weakness, we propose to introduce tree structures into naive Bayesian classification to improve the performance of boosting when working with naive ...
Journal:
Volume / Issue:
Pages: -
Publication year: 1997